237 research outputs found

    Automated analysis of security protocols with global state

    Security APIs, key servers, and protocols that need to keep track of the status of transactions must maintain a global, non-monotonic state, e.g., in the form of a database or register. However, most existing automated verification tools do not support the analysis of such stateful security protocols, sometimes for fundamental reasons, such as the encoding of the protocol as Horn clauses, which are inherently monotonic. A notable exception is the recent tamarin prover, which allows specifying protocols as multiset rewrite (msr) rules, a formalism expressive enough to encode state. As multiset rewriting is a "low-level" specification language with no direct support for concurrent message passing, encoding protocols correctly is a difficult and error-prone process. We propose a process calculus which is a variant of the applied pi calculus with constructs for manipulating a global state from processes running in parallel. We show that this language can be translated to msr rules whilst preserving all security properties expressible in a dedicated first-order logic for security properties. The translation has been implemented in a prototype tool which uses the tamarin prover as a backend. We apply the tool to several case studies, including a simplified fragment of PKCS#11, the Yubikey security token, and an optimistic contract signing protocol.
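
    To make the idea of a non-monotonic global state encoded as multiset rewriting concrete, here is a minimal Python sketch. It is not the paper's process calculus, nor Tamarin's actual input language, and the fact names (Counter, yubikey1) are made up for illustration: the state is a multiset of facts, and firing a rule consumes its premises and adds its conclusions, so old facts genuinely disappear.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    premises: list      # facts that must be present; they are consumed
    conclusions: list   # facts that are added

def applicable(rule: Rule, state: Counter) -> bool:
    """A rule may fire only if every premise occurs in the current multiset."""
    need = Counter(rule.premises)
    return all(state[f] >= n for f, n in need.items())

def apply_rule(rule: Rule, state: Counter) -> Counter:
    """Consume the premises and add the conclusions (a non-monotonic update)."""
    assert applicable(rule, state), f"{rule.name}: premises not satisfied"
    return (state - Counter(rule.premises)) + Counter(rule.conclusions)

# A register holding a (hypothetical) token counter, updated in place:
state = Counter({("Counter", "yubikey1", 1): 1})
bump = Rule("IncrementCounter",
            premises=[("Counter", "yubikey1", 1)],
            conclusions=[("Counter", "yubikey1", 2)])
state = apply_rule(bump, state)
print(state)  # the old fact is gone: Horn-clause monotonicity would not allow this
```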

    Transforming Password Protocols to Compose

    Formal, symbolic techniques are extremely useful for modelling and analysing security protocols. They have improved our understanding of security protocols, allowed the discovery of flaws, and also provide support for protocol design. However, such analyses usually consider that the protocol is executed in isolation or assume a bounded number of protocol sessions. Hence, no security guarantee is provided when the protocol is executed in a more complex environment. In this paper, we study whether password protocols can be safely composed, even when the same password is reused. More precisely, we present a transformation which maps a password protocol that is secure for a single protocol session (a decidable problem) to a protocol that is secure for an unbounded number of sessions. Our result provides an effective strategy to design secure password protocols: (i) design a protocol intended to be secure for one protocol session; (ii) apply our transformation and obtain a protocol which is secure for an unbounded number of sessions. Our technique also applies to the composition of different password protocols, allowing us to obtain both inter-protocol and inter-session composition.

    Fair Exchange in Strand Spaces

    Many cryptographic protocols are intended to coordinate state changes among principals. Exchange protocols coordinate the delivery of new values to the participants, e.g., additions to the set of values they possess. An exchange protocol is fair if it ensures that delivery of new values is balanced: if one participant obtains a new possession via the protocol, then all other participants will, too. Fair exchange requires progress assumptions, unlike some other protocol properties. The strand space model is a framework for the design and verification of cryptographic protocols. A strand is a local behavior of a single principal in a single session of a protocol. A bundle is a partially ordered global execution built from protocol strands and adversary activities. The strand space model needs two additions for fair exchange protocols. First, we regard the state as a multiset of facts, and we allow strands to cause changes in this state via multiset rewriting. Second, progress assumptions stipulate that some channels are resilient, i.e., guaranteed to deliver messages, and that some principals are assumed not to stop at certain critical steps. This method leads to proofs of correctness that cleanly separate protocol properties, such as authentication and confidentiality, from invariants governing state evolution. G. Wang's recent fair exchange protocol illustrates the approach.
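
    Purely to illustrate the shape of a fairness claim under progress assumptions, the toy Python sketch below enumerates the terminal states of a drastically simplified exchange with a trusted resolver and checks that every terminal state is balanced. It is not the strand-space method and bears no relation to G. Wang's protocol; every detail of the model is invented.

```python
# Toy model: a state records whether Alice and Bob hold their new value and
# whether the trusted party has been asked to resolve. Resilient channels are
# reflected by the fact that a requested resolution always delivers to both.

INIT = (False, False, False)   # (alice_has, bob_has, resolved)

def steps(state):
    a, b, r = state
    nxt = set()
    if not a and not b:
        nxt.add((True, True, r))       # optimistic run: both items delivered
    if not r:
        nxt.add((a, b, True))          # either party asks the resolver
    if r:
        nxt.add((True, True, True))    # the resolver delivers to both
    return nxt

def terminal_states():
    seen, frontier, finals = {INIT}, [INIT], set()
    while frontier:
        s = frontier.pop()
        succ = steps(s) - {s}
        if not succ:
            finals.add(s)
        frontier.extend(t for t in succ if t not in seen)
        seen |= succ
    return finals

# Fairness, stated as a balance property over terminal states:
assert all(a == b for a, b, _ in terminal_states())
```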

    Formal Models and Techniques for Analyzing Security Protocols: A Tutorial

    Security protocols are distributed programs that aim at securing communications by means of cryptography. They are for instance used to secure electronic payments, home banking and, more recently, electronic elections. Given the financial and societal impact in case of failure, and the long history of design flaws in such protocols, formal verification is a necessity. A major difference from other safety critical systems is that the properties of security protocols must hold in the presence of an arbitrary adversary. The aim of this paper is to provide a tutorial on some modern approaches for formally modeling protocols and their goals, and for automatically verifying them.

    Contingent payments on a public ledger: models and reductions for automated verification

    We study protocols that rely on a public ledger infrastructure, concentrating on protocols for zero-knowledge contingent payment, whose security properties combine diverse notions of fairness and privacy. We argue that rigorous models are required for capturing the ledger semantics, the protocol-ledger interaction, the cryptographic primitives and, ultimately, the security properties one would like to achieve. Our focus is on a particular level of abstraction, where network messages are represented by a term algebra, protocol execution by state transition systems (e.g., multiset rewrite rules), and where the properties of interest can be analyzed with automated verification tools. We propose models for: (1) the rules guiding the ledger execution, taking the coin functionality of public ledgers such as Bitcoin as an example; (2) the security properties expected from ledger-based zero-knowledge contingent payment protocols; (3) two different security protocols that aim at achieving these properties relying on different ledger infrastructures; (4) reductions that allow simpler term algebras for homomorphic cryptographic schemes. Altogether, these models allow us to derive a first automated verification for ledger-based zero-knowledge contingent payment using the Tamarin prover. Furthermore, our models help in clarifying certain underlying assumptions, as well as security and efficiency tradeoffs, that should be taken into account when deploying protocols on the blockchain.
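
    As a rough illustration of the coin functionality mentioned in point (1), the Python toy below (not the paper's Tamarin model; coin and party names are invented) treats the ledger as an append-only list of transfers, where a transfer is accepted only if it spends an existing, unspent coin owned by the requester, so double spending is rejected.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Transfer:
    spent_coin: str   # id of the coin being spent ("genesis" mints a new coin)
    new_coin: str     # id of the freshly created coin
    new_owner: str    # identity of the payee

@dataclass
class Ledger:
    entries: list = field(default_factory=list)

    def unspent(self):
        """Map every coin that has been created but not yet spent to its owner."""
        created = {t.new_coin: t.new_owner for t in self.entries}
        spent = {t.spent_coin for t in self.entries}
        return {c: o for c, o in created.items() if c not in spent}

    def append(self, t: Transfer, requester: str) -> bool:
        owns_it = self.unspent().get(t.spent_coin) == requester
        fresh_id = t.new_coin not in {e.new_coin for e in self.entries}
        # Minting via "genesis" is unguarded in this toy model.
        if (t.spent_coin == "genesis" or owns_it) and fresh_id:
            self.entries.append(t)
            return True
        return False

ledger = Ledger()
assert ledger.append(Transfer("genesis", "c1", "Alice"), requester="mint")
assert ledger.append(Transfer("c1", "c2", "Bob"), requester="Alice")
assert not ledger.append(Transfer("c1", "c3", "Carol"), requester="Alice")  # double spend
```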

    Automated verification of equivalence properties of cryptographic protocols

    Indistinguishability properties are essential in the formal verification of cryptographic protocols. They are needed to model anonymity properties, strong versions of confidentiality, and resistance to offline guessing attacks, and they can be conveniently modeled using process equivalences. We present a novel procedure to verify equivalence properties for a bounded number of sessions. Our procedure is able to verify trace equivalence for determinate cryptographic protocols. On determinate protocols, trace equivalence coincides with observational equivalence, which can therefore be automatically verified for such processes. When protocols are not determinate, our procedure can be used for both under- and over-approximations of trace equivalence, which proved successful on examples. The procedure can handle a large set of cryptographic primitives, namely those which can be modeled by an optimally reducing convergent rewrite system. Although we were unable to prove its termination, the procedure has been implemented in a prototype tool and has been effectively tested on examples, some of which were outside the scope of existing tools.

    Computing knowledge in security protocols under convergent equational theories

    The analysis of security protocols requires reasoning about the knowledge an attacker acquires by eavesdropping on network traffic. In formal approaches, the messages exchanged over the network are modeled by a term algebra equipped with an equational theory axiomatizing the properties of the cryptographic primitives (e.g., encryption, signatures). In this context, two classical notions of knowledge, deducibility and indistinguishability, yield corresponding decision problems. We propose a procedure for both problems under arbitrary convergent equational theories. Since the underlying problems are undecidable, we cannot guarantee termination. Nevertheless, our procedure terminates on a wide range of equational theories. In particular, we obtain a new decidability result for a theory we encountered when studying electronic voting protocols. We also provide a prototype implementation.
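
    For readers unfamiliar with the setting, the short Python sketch below shows what deduction modulo a convergent theory means on the textbook rule dec(enc(x, y), y) -> x. It is a naive, depth-bounded saturation written only for illustration; it is sound but very incomplete and has nothing to do with the procedure proposed in the paper.

```python
# Terms are nested tuples ("f", arg1, ..., argn); bare strings are atomic names.

def normalise(t):
    """Rewrite with dec(enc(x, k), k) -> x until no rule applies (convergence
    guarantees this is well defined for the toy theory)."""
    if not isinstance(t, tuple):
        return t
    t = (t[0],) + tuple(normalise(a) for a in t[1:])
    if t[0] == "dec" and isinstance(t[1], tuple) and t[1][0] == "enc" and t[1][2] == t[2]:
        return t[1][1]
    return t

def deducible(target, knowledge, depth=2):
    """Close the attacker knowledge under the public symbols enc, dec and pair
    up to a small depth, then test membership of the target's normal form."""
    known = {normalise(t) for t in knowledge}
    for _ in range(depth):
        step = set(known)
        for a in known:
            for b in known:
                step |= {normalise(("enc", a, b)),
                         normalise(("dec", a, b)),
                         ("pair", a, b)}
        known = step
    return normalise(target) in known

print(deducible("v", [("enc", "v", "k"), "k"]))   # True: the key was revealed
print(deducible("v", [("enc", "v", "k")]))        # False: the key stays secret
```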

    A Survey of Symbolic Methods in Computational Analysis of Cryptographic Systems

    Since the 1980s, two approaches have been developed for analyzing security protocols. One of the approaches relies on a computational model that considers issues of complexity and probability. This approach captures a strong notion of security, guaranteed against all probabilistic polynomial-time attacks. The other approach relies on a symbolic model of protocol executions in which cryptographic primitives are treated as black boxes. Since the seminal work of Dolev and Yao, it has been realized that this latter approach enables significantly simpler and often automated proofs. However, the guarantees that it offers have been quite unclear. For more than twenty years the two approaches have coexisted but evolved mostly independently. Recently, significant research efforts have attempted to develop paradigms for cryptographic systems analysis that combine the best of both worlds. Two broad directions have been followed. Computational soundness aims to establish sufficient conditions under which results obtained using symbolic models imply security under computational models. The direct approach aims to apply the principles and techniques developed in the context of symbolic models directly to computational ones. In this paper we survey existing results along both of these directions. Our goal is to provide a rather complete summary that could act as a quick reference for researchers who want to contribute to the field, want to make use of existing results, or just want to get a better picture of what results already exist.

    DeepSec: Deciding Equivalence Properties for Security Protocols -- Improved theory and practice

    Automated verification has become an essential part of the security evaluation of cryptographic protocols. In this context, privacy-type properties are often modelled by indistinguishability statements, expressed as behavioural equivalences in a process calculus. In this paper we contribute both to the theory and practice of this verification problem. We establish new complexity results for static equivalence, trace equivalence and labelled bisimilarity, and provide a decision procedure for these equivalences in the case of a bounded number of protocol sessions. Our procedure is the first to decide trace equivalence and labelled bisimilarity exactly for a large variety of cryptographic primitives, namely those that can be represented by a subterm convergent destructor rewrite system. We also implemented the procedure in a new tool, DeepSec. We show through extensive experiments that it is significantly more efficient than other similar tools, while at the same time extending the scope of the protocols that can be analysed.
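
    To give a concrete sense of the class of primitives handled, the Python fragment below (illustrative only, and not DeepSec's input syntax) writes down a few standard destructor rules and checks the subterm condition: every right-hand side is either a proper subterm of its left-hand side or a constant.

```python
# Terms are nested tuples ("f", arg1, ..., argn); bare strings stand for variables.

RULES = [
    (("sdec", ("senc", "x", "y"), "y"), "x"),               # symmetric decryption
    (("adec", ("aenc", "x", ("pk", "y")), "y"), "x"),       # asymmetric decryption
    (("fst", ("pair", "x", "y")), "x"),
    (("snd", ("pair", "x", "y")), "y"),
    (("check", ("sign", "x", "y"), ("pk", "y")), ("ok",)),  # signature verification
]

def subterms(t):
    yield t
    if isinstance(t, tuple):
        for arg in t[1:]:
            yield from subterms(arg)

def is_subterm_rule(lhs, rhs):
    proper = [s for s in subterms(lhs) if s != lhs]
    constant = isinstance(rhs, tuple) and len(rhs) == 1
    return constant or rhs in proper

assert all(is_subterm_rule(l, r) for l, r in RULES)  # the subterm condition holds
```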